Search results for "dimension reduction"

Showing 10 of 15 documents

Dimension Estimation in Two-Dimensional PCA

2021

We propose an automated way of determining the optimal number of low-rank components in dimension reduction of image data. The method is based on the combination of two-dimensional principal component analysis and an augmentation estimator proposed recently in the literature. Intuitively, the main idea is to combine a scree plot with information extracted from the eigenvectors of a variation matrix. Simulation studies show that the method provides accurate estimates and a demonstration with a finger data set showcases its performance in practice.

Keywords: computer science; dimension reduction; dimensionality reduction; image data; estimator; pattern recognition; dimension estimation; image (mathematics); data modeling; data set; matrix (mathematics); scree plot; principal component analysis; augmentation; artificial intelligence; eigenvalues and eigenvectors
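
As a rough illustration of the two-dimensional PCA step described above, the sketch below computes the column-wise image scatter matrix and projects each image onto its leading eigenvectors. The augmentation estimator proposed in the paper is not reproduced; a plain explained-variance threshold stands in for it, and the function name and threshold are illustrative only.

    import numpy as np

    def two_dimensional_pca(images, var_threshold=0.95):
        """images: array of shape (N, m, n). Returns (k, eigenvector basis, projected images)."""
        X = np.asarray(images, dtype=float)
        Xc = X - X.mean(axis=0)                             # centre the image stack
        # n x n image scatter matrix, averaged over the sample
        G = np.einsum('ijk,ijl->kl', Xc, Xc) / X.shape[0]
        evals, evecs = np.linalg.eigh(G)                    # ascending order
        evals, evecs = evals[::-1], evecs[:, ::-1]          # reorder to descending
        ratio = np.cumsum(evals) / evals.sum()
        k = int(np.searchsorted(ratio, var_threshold)) + 1  # crude stand-in for the augmentation estimator
        V = evecs[:, :k]
        return k, V, X @ V                                  # projected images have shape (N, m, k)

    # toy usage on random "images"
    k, V, scores = two_dimensional_pca(np.random.default_rng(1).normal(size=(50, 32, 24)))
    print(k, scores.shape)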

Data-driven analysis for fMRI during naturalistic music listening

2017

Interest in higher ecological validity in functional magnetic resonance imaging (fMRI) experiments has been growing steadily since the turn of the millennium. The trend is reflected in an increasing number of naturalistic experiments, where participants are exposed to complex real-world stimuli and/or cognitive tasks such as watching a movie, playing video games, or listening to music. Multifaceted stimuli forming parallel streams of input information, combined with reduced control over experimental variables, introduce a number of methodological challenges associated with isolating brain responses to individual events. This exploratory work demonstrated some of those methodological challeng…

Keywords: PCA; fMRI; dimension reduction; music; signal analysis; cognitive processes; listening; principal component analysis; naturalistic experiment; functional magnetic resonance imaging; ICA; CCA; brain; kernel PCA

Efficient unsupervised clustering for spatial bird population analysis along the Loire river

2015

This paper focuses on the application and comparison of Non-Linear Dimensionality Reduction (NLDR) methods on a natural, high-dimensional bird-community dataset along the Loire River (France). In this context, biologists usually use the well-known PCA to explain the upstream-downstream gradient. Unfortunately, this method was unsuccessful on this kind of nonlinear dataset. The goal of this paper is to compare recent NLDR methods coupled with different data transformations in order to find the best approach. Results show that Multiscale Jensen-Shannon Embedding (Ms JSE) outperforms all other methods in this context.

Keywords: clustering algorithms; [INFO.INFO-TS] Computer Science [cs]/Signal and Image Processing; nonlinear dimension reduction; Multiscale Jensen-Shannon Embedding; dimension reduction; Loire River
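
The comparison pipeline below is only a loose analogue of the workflow in this paper: multiscale Jensen-Shannon Embedding is not available in scikit-learn, so Isomap and t-SNE stand in as generic NLDR methods, the Hellinger transform is just one example of a data transformation, and the site-by-species matrix is synthetic.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.manifold import Isomap, TSNE

    rng = np.random.default_rng(0)
    abundances = rng.poisson(3.0, size=(120, 40)).astype(float)   # toy sites-by-species counts

    # Hellinger transformation: square root of row-relative abundances
    hellinger = np.sqrt(abundances / abundances.sum(axis=1, keepdims=True))

    embeddings = {
        "PCA": PCA(n_components=2).fit_transform(hellinger),
        "Isomap": Isomap(n_components=2).fit_transform(hellinger),
        "t-SNE": TSNE(n_components=2, perplexity=30, init="pca").fit_transform(hellinger),
    }
    for name, emb in embeddings.items():
        print(name, emb.shape)                                    # each is a 2-D embedding of the 120 sites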

Dimension reduction for time series in a blind source separation context using R

2021

Multivariate time series observations are increasingly common in multiple fields of science, but the complex dependencies of such data often translate into intractable models with a large number of parameters. An alternative is given by first red…

Keywords: statistics and probability; series (mathematics); stochastic volatility; computer science; blind source separation; supervised dimension reduction; R; signal processing; dimensionality reduction; signal analysis; covariance; time series analysis; R language; dimension (vector space); multivariate methods; time series; algorithm; software
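
The packages described in this paper are written in R; purely as a rough Python analogue of one classical second-order BSS step, the sketch below implements AMUSE-style separation, after which components with negligible serial autocovariance could be treated as noise and dropped. It does not reflect the package's actual interface.

    import numpy as np

    def amuse(x, lag=1):
        """x: (T, p) multivariate time series. Returns estimated sources, shape (T, p)."""
        xc = np.asarray(x, dtype=float) - np.mean(x, axis=0)
        cov = np.cov(xc, rowvar=False)
        evals, evecs = np.linalg.eigh(cov)
        white = evecs @ np.diag(evals ** -0.5) @ evecs.T     # symmetric whitening matrix
        z = xc @ white                                       # whitened series
        # symmetrised lag-tau autocovariance of the whitened series
        a = (z[:-lag].T @ z[lag:]) / (len(z) - lag)
        a = (a + a.T) / 2
        _, u = np.linalg.eigh(a)                             # sources ordered by lag-autocovariance
        return z @ u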

3D-2D dimensional reduction for a nonlinear optimal design problem with perimeter penalization

2012

A 3D-2D dimension reduction for a nonlinear optimal design problem with a perimeter penalization is performed in the realm of $\Gamma$-convergence, providing an integral representation for the limit functional.

Keywords: optimal design; mathematical optimization; integral representation; dimension reduction; dimensionality reduction; perimeter; nonlinear system; dimensional reduction; limit (mathematics); applied mathematics; mathematics; Analysis of PDEs (math.AP)
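
For orientation only, the standard definition of Γ-convergence underlying this entry (and the bending-moment entry further down) is recalled below; the paper's specific 3D-2D energies and the integral representation of the limit functional are not restated.

    % Gamma-convergence of functionals F_eps to F on a metric space X
    \[
      F_\varepsilon \xrightarrow{\;\Gamma\;} F
      \quad\Longleftrightarrow\quad
      \begin{cases}
        \text{liminf inequality:} & \forall\, x_\varepsilon \to x:\quad
          F(x) \le \liminf_{\varepsilon \to 0} F_\varepsilon(x_\varepsilon),\\[4pt]
        \text{recovery sequence:} & \forall\, x\ \exists\, x_\varepsilon \to x:\quad
          F(x) \ge \limsup_{\varepsilon \to 0} F_\varepsilon(x_\varepsilon).
      \end{cases}
    \]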

Signal dimension estimation in BSS models with serial dependence

2022

Many modern multivariate time series datasets contain a large amount of noise, and the first step of the data analysis is to separate the noise channels from the signals of interest. A crucial part of this dimension reduction is determining the number of signals. In this paper we approach this problem by considering a noisy latent variable time series model which comprises many popular blind source separation models. We propose a general framework for the estimation of the signal dimension that is based on testing for sub-sphericity and give examples of different tests suitable for time series settings. In the inference we rely on bootstrap null distributions. Several simulation studies are…

Keywords: nonstationary source separation; dimension reduction; signal processing; time series; sub-sphericity; second order source separation; block bootstrap; time series analysis; 2022 International Conference on Electrical, Computer, Communications and Mechatronics Engineering (ICECCME)
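
The sketch below only illustrates the general idea of a sub-sphericity statistic (how far the smallest covariance eigenvalues are from being equal); the paper calibrates such statistics with bootstrap null distributions suited to serial dependence (e.g. a block bootstrap), which is omitted here, and the fixed threshold is an arbitrary placeholder.

    import numpy as np

    def subsphericity_statistic(x, k):
        """Deviation of the p - k smallest covariance eigenvalues from equality (zero iff the tail is flat)."""
        cov = np.cov(np.asarray(x, dtype=float), rowvar=False)
        evals = np.sort(np.linalg.eigvalsh(cov))[::-1]       # descending eigenvalues
        tail = evals[k:]                                     # presumed noise eigenvalues
        return float(np.mean((tail / tail.mean() - 1.0) ** 2))

    def estimate_signal_dimension(x, threshold=1e-2):
        """Naive estimate: smallest k whose tail statistic falls below a fixed threshold."""
        p = np.asarray(x).shape[1]
        for k in range(p):
            if subsphericity_statistic(x, k) < threshold:
                return k
        return p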

Combining PCA and multiset CCA for dimension reduction when group ICA is applied to decompose naturalistic fMRI data

2015

An extension of group independent component analysis (GICA) is introduced, where multi-set canonical correlation analysis (MCCA) is combined with principal component analysis (PCA) for three-stage dimension reduction. The method is applied to naturalistic functional MRI (fMRI) images acquired during a task-free continuous music-listening experiment, and the results are compared with the outcome of conventional GICA. The extended GICA resulted in slightly faster ICA convergence and, more interestingly, extracted more stimulus-related components than its conventional counterpart. Therefore, we think the extension is a beneficial enhancement for GICA, especially when applied to challenging fMRI d…

Keywords: multiset; PCA; dimension reduction; dimensionality reduction; speech recognition; pattern recognition; music listening; naturalistic fMRI; group independent component analysis; principal component analysis; temporal concatenation; artificial intelligence; canonical correlation; multiset CCA; mathematics
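
For context, the sketch below shows only the conventional two-stage PCA reduction followed by spatial ICA against which the extended method is compared; the multi-set CCA stage that the paper inserts between the subject- and group-level PCAs is not reproduced, and the component counts are arbitrary.

    import numpy as np
    from sklearn.decomposition import PCA, FastICA

    def two_stage_gica(subject_data, n_subj=30, n_group=20):
        """subject_data: list of (time, voxels) arrays; assumes n_subj <= number of time points."""
        # stage 1: subject-level PCA along the temporal dimension
        reduced = [PCA(n_components=n_subj).fit_transform(x.T).T for x in subject_data]
        # stage 2: temporal concatenation followed by group-level PCA
        concat = np.vstack(reduced)                          # (n_subjects * n_subj, voxels)
        group = PCA(n_components=n_group).fit_transform(concat.T).T
        # stage 3: spatial ICA on the group-reduced data
        maps = FastICA(n_components=n_group, max_iter=1000).fit_transform(group.T)
        return maps                                          # (voxels, n_group) spatial components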

Dimensional reduction for energies with linear growth involving the bending moment

2008

A $\Gamma$-convergence analysis is used to perform a 3D-2D dimension reduction of variational problems with linear growth. The adopted scaling gives rise to a nonlinear membrane model which, because of the presence of higher order external loadings inducing a bending moment, may depend on the average in the transverse direction of a Cosserat vector field, as well as on the deformation of the mid-plane. The assumption of linear growth on the energy leads to an asymptotic analysis in the spaces of measures and of functions with bounded variation.

Keywords: asymptotic analysis; dimension reduction; tangent measures; bending moments; scaling; functions of bounded variation; deformation (mechanics); mathematical analysis; applied mathematics; nonlinear system; Γ-convergence; dimensional reduction; bounded variation; bending moment; vector field; Mathematics - Analysis of PDEs (math.AP); [MATH.MATH-AP] Mathematics [math]/Analysis of PDEs [math.AP]; MSC: 49J45, 49Q20, 74K35

On the usage of joint diagonalization in multivariate statistics

2022

Scatter matrices generalize the covariance matrix and are useful in many multivariate data analysis methods, including well-known principal component analysis (PCA), which is based on the diagonalization of the covariance matrix. The simultaneous diagonalization of two or more scatter matrices goes beyond PCA and is used more and more often. In this paper, we offer an overview of many methods that are based on a joint diagonalization. These methods range from the unsupervised context with invariant coordinate selection and blind source separation, which includes independent component analysis, to the supervised context with discriminant analysis and sliced inverse regression. They also enco…

Keywords: statistics and probability; scatter matrices; multivariate statistics; blind source separation; sliced inverse regression; supervised dimension reduction; numerical analysis; covariance matrix; pattern recognition; independent component analysis; mathematical statistics; linear discriminant analysis; invariant coordinate selection; multivariate methods; principal component analysis; dimension reduction; artificial intelligence; mathematics
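
As one concrete, minimal member of the joint-diagonalization family surveyed here, the sketch below implements FOBI, which simultaneously diagonalizes the covariance matrix and a fourth-moment scatter matrix; the other methods covered by the review (invariant coordinate selection, sliced inverse regression, discriminant analysis, and so on) are not shown.

    import numpy as np

    def fobi(x):
        """x: (n, p) data matrix. Returns estimated independent components, shape (n, p)."""
        xc = np.asarray(x, dtype=float) - np.mean(x, axis=0)
        cov = np.cov(xc, rowvar=False)
        evals, evecs = np.linalg.eigh(cov)
        white = evecs @ np.diag(evals ** -0.5) @ evecs.T     # first scatter: covariance (whitening)
        z = xc @ white
        # second scatter: fourth-moment matrix of the whitened data
        r2 = np.sum(z ** 2, axis=1)
        cov4 = (z * r2[:, None]).T @ z / len(z)
        _, u = np.linalg.eigh(cov4)
        return z @ u                                         # components in FOBI coordinates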

Linear Feature Extraction for Ranking

2018

We address the feature extraction problem for document ranking in information retrieval. We then propose LifeRank, a Linear feature extraction algorithm for Ranking. In LifeRank, we regard each document collection for ranking as a matrix, referred to as the original matrix. We try to optimize a transformation matrix, so that a new matrix (dataset) can be generated as the product of the original matrix and the transformation matrix. The transformation matrix projects high-dimensional document vectors into lower dimensions. Theoretically, there could be a very large number of transformation matrices, each leading to a new generated matrix. In LifeRank, we produce a transformation matrix so that the generat…

Keywords: dimension reduction; computer science; feature extraction; feature selection; information retrieval systems; library and information sciences; ranking (information retrieval); matrix (mathematics); transformation matrix; algorithms; information retrieval; learning to rank; pattern recognition; machine learning; benchmark (computing); artificial intelligence; information systems
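
The data flow described above (original matrix times transformation matrix gives a lower-dimensional dataset) is sketched below; a plain PCA projection stands in for the transformation matrix that LifeRank actually learns by optimizing ranking quality, and the matrix sizes are made up (136 features is just a LETOR-like example).

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    original = rng.normal(size=(500, 136))     # toy document-feature matrix
    pca = PCA(n_components=20).fit(original)
    transformation = pca.components_.T         # (136, 20) projection matrix
    generated = original @ transformation      # new lower-dimensional dataset fed to the ranker
    print(generated.shape)                     # (500, 20)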